Results 1 - 20 of 23
1.
Membranes (Basel) ; 14(4)2024 Mar 25.
Article in English | MEDLINE | ID: mdl-38668101

ABSTRACT

The high concentration of chloride ions in desulfurization wastewater is the primary factor limiting its reusability. Monovalent anion selective electrodialysis (S-ED) enables the selective removal of chloride ions, thereby facilitating the reuse of desulfurization wastewater. In this study, different concentrations of NaCl and Na2SO4 were used to simulate different softened desulfurization wastewaters. The effects of current density and of the NaCl and Na2SO4 concentrations on ion flux, permselectivity (PSO42-Cl-) and specific energy consumption were studied. The results show that the Selemion ASA membrane exhibits excellent permselectivity between Cl- and SO42-, with a significantly lower flux observed for SO42- than for Cl-. Current density exerts a significant influence on ion flux: as the current density increases, the flux of SO42- also increases, but at a lower rate than that of Cl-, resulting in an increase in permselectivity. When the current density reaches 25 mA/cm2, the permselectivity reaches a maximum of 50.4. An increase in NaCl concentration leads to a decrease in the SO42- flux; nevertheless, the permselectivity is reduced because of the elevated Cl-/SO42- ratio. The SO42- flux increases with increasing Na2SO4 concentration, while the permselectivity increases as the Cl-/SO42- ratio decreases.
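
Definitions of permselectivity vary between papers; one common form is the ratio of concentration-normalised fluxes. The sketch below is an illustrative assumption, not necessarily the exact definition used in this study, and all names are hypothetical:

```python
def permselectivity(flux_cl, flux_so4, conc_cl, conc_so4):
    """One common definition of monovalent-anion permselectivity:
    P(SO4, Cl) = (J_Cl / C_Cl) / (J_SO4 / C_SO4),
    i.e. how strongly the membrane favours Cl- over SO4(2-) once the
    feed concentrations are accounted for.  Illustrative only."""
    return (flux_cl / conc_cl) / (flux_so4 / conc_so4)

# Hypothetical values: Cl- flux 2.0, SO4 flux 0.1 (same units),
# feed concentrations 1.0 and 0.5 mol/L respectively.
p = permselectivity(2.0, 0.1, 1.0, 0.5)
```

A value well above 1 indicates preferential transport of the monovalent anion.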

2.
IEEE Trans Pattern Anal Mach Intell ; 45(12): 14709-14726, 2023 Dec.
Article in English | MEDLINE | ID: mdl-37651495

ABSTRACT

Information can be quantified and expressed by uncertainty, and improving the decision level of uncertain information is vital in modeling and processing it. Dempster-Shafer evidence theory can model and process uncertain information effectively. However, the Dempster combination rule may produce counter-intuitive results when dealing with highly conflicting information, leading to a decline in decision level. Measuring conflict is therefore significant for improving the decision level. Motivated by this issue, this paper proposes a novel method to measure the discrepancy between bodies of evidence. First, a model of dynamic fractal probability transformation is proposed to effectively obtain more information about the non-specificity of basic belief assignments (BBAs). Then, we propose the higher-order fractal belief Rényi divergence (HOFBReD), which can effectively measure the discrepancy between BBAs. Moreover, it is the first belief Rényi divergence that can measure the discrepancy between BBAs with dynamic fractal probability transformation. HOFBReD has several properties in terms of probability transformation as well as measurement. When the dynamic fractal probability transformation ends, HOFBReD is equivalent to measuring the Rényi divergence between the pignistic probability transformations of the BBAs. When the BBAs degenerate to probability distributions, HOFBReD also degenerates to, or is related to, several well-known divergences. In addition, a novel multisource information fusion algorithm based on HOFBReD is proposed. A pattern classification experiment with real-world datasets is presented to compare the proposed algorithm with other methods. The experimental results indicate that the proposed algorithm achieves a higher average pattern recognition accuracy on all datasets than the other methods. The proposed discrepancy measure and multisource information fusion algorithm contribute to the improvement of decision level.
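
The limiting case mentioned in the abstract — the Rényi divergence between the pignistic probability transformations of two BBAs — can be sketched from its two standard building blocks. This is a minimal illustration of those textbook components, not the authors' HOFBReD implementation:

```python
import math

def pignistic(bba):
    """Pignistic probability transformation:
    BetP(x) = sum over focal sets A containing x of m(A) / |A|."""
    betp = {}
    for focal, mass in bba.items():
        for x in focal:
            betp[x] = betp.get(x, 0.0) + mass / len(focal)
    return betp

def renyi_divergence(p, q, alpha=2.0):
    """Renyi divergence of order alpha (alpha != 1):
    D_alpha(P || Q) = log(sum_x p(x)^alpha * q(x)^(1 - alpha)) / (alpha - 1)."""
    s = sum(p[x] ** alpha * q[x] ** (1.0 - alpha) for x in p)
    return math.log(s) / (alpha - 1.0)

# Two BBAs over the frame {a, b}; focal elements are frozensets.
m1 = {frozenset('a'): 0.6, frozenset('b'): 0.1, frozenset('ab'): 0.3}
m2 = {frozenset('a'): 0.2, frozenset('b'): 0.5, frozenset('ab'): 0.3}
p1, p2 = pignistic(m1), pignistic(m2)  # p1 = {'a': 0.75, 'b': 0.25}
```

The divergence is zero between identical BBAs and grows as their pignistic distributions diverge.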

3.
Appl Intell (Dordr) ; : 1-20, 2023 Jan 30.
Article in English | MEDLINE | ID: mdl-36741743

ABSTRACT

A network with some or all of the characteristics of scale-freeness, self-similarity, self-organization, attractors, and the small world is defined as a complex network. The identification of significant spreaders is an indispensable research direction in complex networks, which aims to discover nodes that play a crucial role in the structure and function of the network. Since influencers are essential for studying the security of the network and controlling its propagation processes, methods for assessing them are of great significance and practical value for solving many problems. However, how to effectively combine global information with local information is still an open problem. To solve it, the generalized mechanics model is further improved in this paper: a generalized mechanics model based on information entropy is proposed to discover crucial spreaders in complex networks. The influence of each neighbor node on local information is quantified by information entropy, and the interaction between nodes at the global level is considered by calculating shortest distances. Extensive tests on eleven real networks indicate that the proposed approach is much faster and more precise than traditional methods and state-of-the-art benchmarks, and that it is effective for identifying influencers in complex networks.

4.
IEEE Trans Pattern Anal Mach Intell ; 45(2): 2054-2070, 2023 Feb.
Article in English | MEDLINE | ID: mdl-35420983

ABSTRACT

In artificial intelligence systems, the question of how to express uncertainty in knowledge remains an open issue. The negation scheme provides a new perspective on this issue. In this paper, we study quantum decisions from the negation perspective. Specifically, complex evidence theory (CET) is considered effective for expressing and handling uncertain information in a complex plane. Therefore, we first express CET in the quantum framework of Hilbert space. On this basis, a generalized negation method is proposed for the quantum basic belief assignment (QBBA), called QBBA negation. In addition, a QBBA entropy is revisited to study the QBBA negation process and reveal the variation tendency of the negation iteration. Meanwhile, the properties of the QBBA negation function are analyzed and discussed along with special cases. Then, several multisource quantum information fusion (MSQIF) algorithms are designed to support decision making. Finally, these MSQIF algorithms are applied to pattern classification to demonstrate their effectiveness. This is the first work to design MSQIF algorithms supporting quantum decision making from the new perspective of "negation", and it provides promising solutions to knowledge representation, uncertainty measurement, and the fusion of quantum information.

5.
IEEE Trans Cybern ; 52(8): 7402-7414, 2022 Aug.
Article in English | MEDLINE | ID: mdl-33400662

ABSTRACT

Uncertainty is inevitable in the decision-making process of real applications. Quantum mechanics has become an interesting and popular topic in predicting and explaining human decision-making behaviors, especially regarding interference effects caused by uncertainty in the process of decision making, due to the limitations of Bayesian reasoning. In addition, complex evidence theory (CET), as a generalized Dempster-Shafer evidence theory, has been proposed to represent and handle uncertainty in the framework of the complex plane, and it is an effective tool in uncertainty reasoning. Particularly, the complex mass function, also known as a complex basic belief assignment in CET, is complex-value modeled, which is superior to the classical mass function in expressing uncertain information. CET is considered to have certain inherent connections with quantum mechanics since both are complex-value modeled and can be applied in handling uncertainty in decision-making problems. In this article, therefore, by bridging CET and quantum mechanics, we propose a new complex evidential quantum dynamical (CEQD) model to predict interference effects on human decision-making behaviors. In addition, uniform and weighted complex Pignistic belief transformation functions are proposed, which can be used effectively in the CEQD model to help explain interference effects. The experimental results and comparisons demonstrate the effectiveness of the proposed method. In summary, the proposed CEQD method provides a new perspective to study and explain the interference effects involved in human decision-making behaviors, which is significant for decision theory.


Subjects
Bayes Theorem , Humans , Uncertainty
6.
J Healthc Eng ; 2021: 5559529, 2021.
Article in English | MEDLINE | ID: mdl-33777342

ABSTRACT

In decision-making systems, how to measure uncertain information remains an open issue, especially for information processing modeled on complex planes. In this paper, a new complex entropy is proposed to measure the uncertainty of a complex-valued distribution (CvD). The proposed complex entropy is a generalization of Gini entropy that has a powerful capability to measure uncertainty. In particular, when a CvD reduces to a probability distribution, the complex entropy will degrade into Gini entropy. In addition, the properties of complex entropy, including the nonnegativity, maximum and minimum entropies, and boundedness, are analyzed and discussed. Several numerical examples illuminate the superiority of the newly defined complex entropy. Based on the newly defined complex entropy, a multisource information fusion algorithm for decision-making is developed. Finally, we apply the decision-making algorithm in a medical diagnosis problem to validate its practicability.
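
The Gini entropy that the proposed complex entropy generalizes is the standard Gini-Simpson index; a minimal sketch of that classical baseline (not the complex-valued extension itself):

```python
def gini_entropy(p):
    """Gini entropy (Gini-Simpson index) of a probability distribution:
    G(P) = 1 - sum_i p_i^2.  It is 0 for a degenerate distribution and
    reaches its maximum, 1 - 1/n, for the uniform distribution over n outcomes."""
    return 1.0 - sum(px * px for px in p)

# A certain outcome carries no uncertainty; the uniform case is maximal.
g_certain = gini_entropy([1.0, 0.0, 0.0])   # 0.0
g_uniform = gini_entropy([0.25] * 4)        # 1 - 4 * 0.0625 = 0.75
```

The complex entropy of the paper reduces to this quantity when the complex-valued distribution reduces to an ordinary probability distribution.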


Subjects
Algorithms , Decision Making , Diagnosis , Entropy , Uncertainty , Humans , Probability
7.
Sensors (Basel) ; 21(3)2021 Jan 27.
Article in English | MEDLINE | ID: mdl-33513860

ABSTRACT

Multisource information fusion has received much attention in the past few decades, especially for the smart Internet of Things (IoT). Because of the impacts of devices, the external environment, and communication problems, the collected information may be uncertain, imprecise, or even conflicting. How to handle such kinds of uncertainty is still an open issue. Complex evidence theory (CET) is effective at disposing of uncertainty problems in the multisource information fusion of the IoT. In CET, however, how to measure the distance among complex basic belief assignments (CBBAs) to manage conflict is still an open issue, whose solution would benefit the performance of the IoT fusion process. In this paper, therefore, a complex Pignistic transformation function is first proposed to transform the complex mass function; then, a generalized betting commitment-based distance (BCD) is proposed to measure the difference among CBBAs in CET. The proposed BCD is a generalized model that offers more capacity for measuring the difference among CBBAs. Additionally, further properties of the BCD are analyzed, including non-negativity, nondegeneracy, symmetry, and the triangle inequality. Besides, a basic algorithm and its weighted extension for multi-attribute decision-making are designed based on the newly defined BCD. Finally, these decision-making algorithms are applied to a medical diagnosis problem in the smart IoT environment to reveal their effectiveness.


Subjects
Algorithms , Internet of Things , Uncertainty
8.
IEEE Trans Neural Netw Learn Syst ; 32(4): 1525-1535, 2021 04.
Article in English | MEDLINE | ID: mdl-32310802

ABSTRACT

Evidence theory is an effective methodology for modeling and processing uncertainty that has been widely applied in various fields. In evidence theory, a number of distance measures have been presented, which play an important role in representing the degree of difference between pieces of evidence. However, the existing evidential distances focus on traditional basic belief assignments (BBAs) modeled in terms of real numbers and are not compatible with complex BBAs (CBBAs) extended to the complex plane. Therefore, in this article, a generalized evidential distance measure called the complex evidential distance (CED) is proposed, which can measure the difference or dissimilarity between CBBAs in complex evidence theory. This is the first work to consider distance measures for CBBAs, and it provides a promising way to measure the differences between pieces of evidence in a more general framework of complex plane space. Furthermore, the CED is a strict distance metric with the properties of nonnegativity, nondegeneracy, symmetry, and triangle inequality that satisfies the axioms of a distance. In particular, when the CBBAs degenerate into classical BBAs, the CED will degenerate into Jousselme et al.'s distance. Therefore, the proposed CED is a generalization of the traditional evidential distance, but it has a greater ability to measure the difference or dissimilarity between pieces of evidence. Finally, a decision-making algorithm for pattern recognition is devised based on the CED and is applied to a medical diagnosis problem to illustrate its practicability.
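
The classical Jousselme et al. distance, which the CED degenerates to for real-valued BBAs, can be sketched as follows. Function and variable names are illustrative, and this is the textbook real-valued formula, not the complex-valued CED:

```python
import math
from itertools import combinations

def jousselme_distance(m1, m2, frame):
    """Jousselme et al.'s distance between two BBAs:
    d = sqrt(0.5 * (m1 - m2)^T D (m1 - m2)),
    where D(A, B) = |A intersect B| / |A union B| (Jaccard index)."""
    # Enumerate the non-empty subsets of the frame in a fixed order.
    subsets = [frozenset(c) for r in range(1, len(frame) + 1)
               for c in combinations(sorted(frame), r)]
    diff = [m1.get(s, 0.0) - m2.get(s, 0.0) for s in subsets]
    total = 0.0
    for i, a in enumerate(subsets):
        for j, b in enumerate(subsets):
            total += diff[i] * (len(a & b) / len(a | b)) * diff[j]
    return math.sqrt(max(total, 0.0) * 0.5)  # clamp tiny float negatives

# Two fully conflicting singleton BBAs are at distance 1.
d_max = jousselme_distance({frozenset('a'): 1.0},
                           {frozenset('b'): 1.0}, ['a', 'b'])
```

Identical BBAs sit at distance 0, satisfying the nondegeneracy property listed in the abstract.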

9.
ISA Trans ; 106: 253-261, 2020 Nov.
Article in English | MEDLINE | ID: mdl-32622541

ABSTRACT

Multi-sensor data fusion (MSDF) is an efficient technology for enhancing system performance by involving different kinds of sensors, and it is broadly utilized in many fields at present. However, the data obtained from multiple sensors may have different degrees of uncertainty in practical applications. Evidence theory is very useful for conveying and managing uncertainty without a priori probability, so it has been widely adopted in information fusion. However, in the face of conflicting evidence, applying Dempster's combination rule (DCR) can produce counterintuitive results. To solve this issue, a hybrid MSDF method is developed by integrating evidence theory with a newly defined evidential credibility measure based on prospect theory. More specifically, a series of concepts for the evidential credibility measure is first presented, including the local credibility degree, global credibility degree, evidential credibility estimation, and credibility prospect value function, to comprehensively describe the reward and punishment grades for credible and incredible evidence, respectively. On this basis, an appropriate weight for each piece of evidence can be obtained. Ultimately, the weight of each piece of evidence is leveraged to amend the primitive evidence before applying the DCR. The experimental results demonstrate that the hybrid MSDF approach is efficient and superior in handling conflicting evidence and in data fusion applications.

10.
Sensors (Basel) ; 19(5)2019 Mar 07.
Article in English | MEDLINE | ID: mdl-30866555

ABSTRACT

Time series data fusion is important in real applications such as target recognition based on sensor information. The existing credibility decay model (CDM) is not efficient when the time interval between data from sensors is too long. To address this issue, a new method based on the ordered weighted aggregation operator (OWA) is presented in this paper. By using the Q function in the OWA, the effect of the time interval on the final fusion result is decreased. An application to target recognition based on time series data fusion illustrates the efficiency of the new method, which shows promise for time series data fusion.
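
The core OWA aggregation step can be sketched as follows. The paper's Q-function-based weight generation is not reproduced here; the weights below are illustrative inputs:

```python
def owa(values, weights):
    """Ordered weighted averaging (OWA): sort the inputs in descending order
    and take the weighted sum, so weights attach to ranks rather than to
    particular sources."""
    assert len(values) == len(weights) and abs(sum(weights) - 1.0) < 1e-9
    return sum(w * v for w, v in zip(weights, sorted(values, reverse=True)))

# Three sensor readings; the top-ranked reading gets the largest weight.
fused = owa([0.3, 0.9, 0.6], [0.5, 0.3, 0.2])  # 0.5*0.9 + 0.3*0.6 + 0.2*0.3
```

With weights (1, 0, ..., 0) OWA reduces to the maximum; with uniform weights it reduces to the arithmetic mean.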

11.
Entropy (Basel) ; 21(6)2019 Jun 20.
Article in English | MEDLINE | ID: mdl-33267325

ABSTRACT

Dempster-Shafer (DS) evidence theory is widely applied in multi-source data fusion technology. However, the classical DS combination rule fails when the evidence is highly conflicting. To address this problem, a novel multi-source data fusion method is proposed in this paper. Its main steps are as follows. First, the credibility weight of each piece of evidence is obtained by transforming the belief Jensen-Shannon divergence into belief similarities. Next, the belief entropy of each piece of evidence is calculated and the information volume weights of the evidence are generated. Then, the credibility weights and information volume weights are unified into the final weight of each piece of evidence, from which the weighted average evidence is calculated. Finally, the classical DS combination rule is applied multiple times to the modified evidence to generate the fusion result. A numerical example compares the fusion result of the proposed method with that of other existing combination rules. Further, a practical application to fault diagnosis is presented to illustrate the plausibility and efficiency of the proposed method. The experimental results show that the targeted fault type is recognized most accurately by the proposed method in comparison with other combination rules.
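
The final fusion step relies on the classical Dempster combination rule. A minimal sketch of that rule alone (the divergence- and entropy-based weighting pipeline described above is not reproduced):

```python
def dempster_combine(m1, m2):
    """Dempster's combination rule: m(C) is proportional to the total mass of
    pairs (A, B) with A intersect B = C; mass on empty intersections is the
    conflict, which is removed and the remainder renormalised."""
    combined, conflict = {}, 0.0
    for a, ma in m1.items():
        for b, mb in m2.items():
            c = a & b
            if c:
                combined[c] = combined.get(c, 0.0) + ma * mb
            else:
                conflict += ma * mb
    if conflict >= 1.0:
        raise ValueError("total conflict: Dempster's rule is undefined")
    k = 1.0 - conflict
    return {c: v / k for c, v in combined.items()}

# Example: partial conflict between two sources over the frame {a, b}.
fused = dempster_combine(
    {frozenset('a'): 0.6, frozenset('ab'): 0.4},
    {frozenset('a'): 0.5, frozenset('b'): 0.5},
)
```

Here the conflict mass is 0.3, so the surviving masses are renormalised by 0.7; the counter-intuitive behaviour the abstract mentions appears as the conflict approaches 1.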

12.
Entropy (Basel) ; 21(1)2019 Jan 15.
Article in English | MEDLINE | ID: mdl-33266789

ABSTRACT

The negation of probability provides a new way of looking at information representation. However, the negation of a basic probability assignment (BPA) is still an open issue. To address it, a novel negation method for BPAs based on a total uncertainty measure is proposed in this paper, which takes into account the uncertainty of non-singleton elements in the power set. Compared with the negation of a probability distribution, the proposed negation of a BPA differs in that the BPA of a given element is reassigned to the other elements in the power set, where the weight of reassignment is proportional to the cardinality of the intersection of that element with each remaining element in the power set. Notably, the proposed negation of a BPA reduces to the negation of a probability distribution when the BPA reduces to a classical probability. Furthermore, it is proved mathematically that the proposed negation method is indeed based on maximum uncertainty.
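
The probability-distribution negation that the proposed BPA negation reduces to is the standard renormalised complement (commonly attributed to Yager); a minimal sketch of that baseline case:

```python
def negate_probability(p):
    """Negation of a probability distribution: each outcome receives the
    renormalised probability of all the OTHER outcomes,
    neg(p_i) = (1 - p_i) / (n - 1), where n is the number of outcomes."""
    n = len(p)
    return {x: (1.0 - px) / (n - 1) for x, px in p.items()}

# A peaked distribution becomes anti-peaked; the uniform one is a fixed point.
neg = negate_probability({'a': 0.7, 'b': 0.2, 'c': 0.1})
```

Repeated negation drives any distribution toward the uniform, maximum-entropy one, which is the sense in which such negations are "based on maximum uncertainty".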

13.
Entropy (Basel) ; 21(2)2019 Feb 22.
Article in English | MEDLINE | ID: mdl-33266926

ABSTRACT

Failure Mode and Effects Analysis (FMEA) has been regarded as an effective analysis approach to identify and rank potential failure modes in many applications. However, how to determine the weights of team members appropriately, accounting for domain experts' uncertainty in FMEA decision-making, is still an open issue. In this paper, a new method to determine the weights of team members, combining evidence theory, intuitionistic fuzzy sets (IFSs), and belief entropy, is proposed to analyze failure modes. One advantage of the presented model is that the uncertainty of experts in the decision-making process is taken into consideration. The proposed method is data-driven, with objective and reasonable properties, and considers the risk in the weights more completely. A numerical example is shown to illustrate the feasibility and usefulness of the proposed method.

14.
Sensors (Basel) ; 18(11)2018 Nov 02.
Article in English | MEDLINE | ID: mdl-30400158

ABSTRACT

Efficient matching of incoming events of data streams to persistent queries is fundamental to event stream processing systems in wireless sensor networks. These applications require dealing with high volume and continuous data streams with fast processing time on distributed complex event processing (CEP) systems. Therefore, a well-managed parallel processing technique is needed for improving the performance of the system. However, the specific properties of pattern operators in the CEP systems increase the difficulties of the parallel processing problem. To address these issues, a parallelization model and an adaptive parallel processing strategy are proposed for the complex event processing by introducing a histogram and utilizing the probability and queue theory. The proposed strategy can estimate the optimal event splitting policy, which can suit the most recent workload conditions such that the selected policy has the least expected waiting time for further processing of the arriving events. The proposed strategy can keep the CEP system running fast under the variation of the time window sizes of operators and the input rates of streams. Finally, the utility of our work is demonstrated through the experiments on the StreamBase system.

15.
Wei Sheng Yan Jiu ; 47(2): 301-306, 2018 Mar.
Article in Chinese | MEDLINE | ID: mdl-29903288

ABSTRACT

OBJECTIVE: To evaluate the dietary exposure level and health risk of antimony for children and adolescents in Hunan Province. METHODS: The antimony content of the main foods was determined. The dietary exposure of children and adolescents from Hunan was calculated according to the body weights and intakes from the Survey Report on Nutrition and Health Status of Chinese Residents, Part 10: Nutrition and Health Data in 2002, combined with the mean and 95th-percentile antimony data. The health risk was evaluated by comparison with the ADI. RESULTS: The average antimony exposures of the population in the 3 age groups were 1.01-1.30, 0.85-1.04 and 0.83-0.98 µg/kg BW, which exceeded the ADI limit (0.86 µg/kg BW) from the WHO. The average exposure to antimony decreased with age, and there were significant differences in antimony exposure between the five age groups (F = 30.597, P < 0.05). There was no significant difference between males and females of the same age (F = 0.155, P > 0.05). In medium and small-sized cities, the antimony exposure of juveniles was slightly higher than that in the three types of villages, but the difference was non-significant (F = 0.111, P > 0.05). The top three dietary sources of antimony were light-colored vegetables (52.1%-61.6%), dark vegetables (21.1%-24.0%) and grain (6.0%-9.9%). CONCLUSION: Antimony intake from food by young children is higher than the TDI, so there may be health risks.
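
The point-estimate dietary exposure calculation underlying the METHODS section can be sketched generically as follows; the values are illustrative, not the study's data:

```python
def dietary_exposure(foods, body_weight_kg):
    """Point-estimate dietary exposure in ug/kg BW/day:
    sum over foods of (contaminant content in ug/kg food * daily intake
    in kg of food), divided by body weight."""
    return sum(content * intake for content, intake in foods) / body_weight_kg

# Hypothetical example: two foods at 10 and 5 ug/kg, eaten at 0.2 and 0.4
# kg/day, by a 40 kg child.
exposure = dietary_exposure([(10.0, 0.2), (5.0, 0.4)], 40.0)
```

The resulting value is then compared against a health-based guidance value such as the ADI or TDI.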


Subjects
Antimony/toxicity , Diet , Dietary Exposure , Risk Assessment/methods , Adolescent , Child , Female , Food Contamination , Humans , Male , Vegetables
16.
Sensors (Basel) ; 18(5)2018 May 09.
Article in English | MEDLINE | ID: mdl-29747419

ABSTRACT

Dempster-Shafer evidence theory is widely applied in various fields related to information fusion. However, how to avoid the counter-intuitive results is an open issue when combining highly conflicting pieces of evidence. In order to handle such a problem, a weighted combination method for conflicting pieces of evidence in multi-sensor data fusion is proposed by considering both the interplay between the pieces of evidence and the impacts of the pieces of evidence themselves. First, the degree of credibility of the evidence is determined on the basis of the modified cosine similarity measure of basic probability assignment. Then, the degree of credibility of the evidence is adjusted by leveraging the belief entropy function to measure the information volume of the evidence. Finally, the final weight of each piece of evidence generated from the above steps is obtained and adopted to modify the bodies of evidence before using Dempster's combination rule. A numerical example is provided to illustrate that the proposed method is reasonable and efficient in handling the conflicting pieces of evidence. In addition, applications in data classification and motor rotor fault diagnosis validate the practicability of the proposed method with better accuracy.
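
The plain cosine similarity between two BPAs, the starting point of the credibility measure above, can be sketched as follows; the paper's modified version presumably differs, so treat this as the unmodified baseline:

```python
import math

def bpa_cosine_similarity(m1, m2, subsets):
    """Cosine similarity between two BPAs viewed as mass vectors over a
    fixed enumeration of focal subsets: dot(v1, v2) / (|v1| * |v2|)."""
    v1 = [m1.get(s, 0.0) for s in subsets]
    v2 = [m2.get(s, 0.0) for s in subsets]
    dot = sum(a * b for a, b in zip(v1, v2))
    n1 = math.sqrt(sum(a * a for a in v1))
    n2 = math.sqrt(sum(b * b for b in v2))
    return dot / (n1 * n2)

# Fixed enumeration of the non-empty subsets of the frame {a, b}.
subsets = [frozenset('a'), frozenset('b'), frozenset('ab')]
m1 = {frozenset('a'): 0.6, frozenset('ab'): 0.4}
m2 = {frozenset('b'): 1.0}
```

Identical BPAs score 1, and BPAs with disjoint focal sets score 0, which is why the similarity can serve as a credibility degree.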

17.
Entropy (Basel) ; 21(1)2018 Dec 21.
Article in English | MEDLINE | ID: mdl-33266721

ABSTRACT

Bayesian update is widely used in data fusion. However, information quality is not taken into consideration in the classical Bayesian update method. In this paper, a new Bayesian update with information quality, under the framework of evidence theory, is proposed. First, the discounting coefficient is determined by the information quality. Second, the prior probability distribution is discounted into a basic probability assignment. Third, the basic probability assignments from different sources are combined with Dempster's combination rule to obtain the fusion result. Finally, with the aid of the pignistic probability transformation, the combination result is converted into a posterior probability distribution. A numerical example and a real application in target recognition show the efficiency of the proposed method, which can be seen as a generalized Bayesian update: if information quality is not considered, it degenerates to the classical Bayesian update.
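
The discounting step, turning a distribution into a BPA weakened by a reliability coefficient, follows the classical Shafer discounting scheme; a minimal sketch under that assumption (the paper may determine the coefficient differently):

```python
def discount(bba, alpha, frame):
    """Shafer discounting: scale every focal mass by the reliability
    coefficient alpha and move the stripped mass (1 - alpha) onto the whole
    frame, i.e. onto total ignorance."""
    out = {a: alpha * m for a, m in bba.items()}
    out[frame] = out.get(frame, 0.0) + (1.0 - alpha)
    return out

# A fully confident prior on 'a', discounted by a reliability of 0.8.
d = discount({frozenset('a'): 1.0}, 0.8, frozenset('ab'))
```

With alpha = 1 the evidence is kept as-is, which is how the method degenerates to the classical Bayesian update when quality is ignored.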

18.
Sensors (Basel) ; 17(11)2017 Oct 31.
Article in English | MEDLINE | ID: mdl-29088117

ABSTRACT

The multi-sensor data fusion technique plays a significant role in fault diagnosis and a variety of similar applications, and Dempster-Shafer evidence theory is employed to improve system performance; however, it may generate counter-intuitive results when the pieces of evidence highly conflict with each other. To handle this problem, a novel multi-sensor data fusion approach based on the distance of evidence, belief entropy, and fuzzy preference relation analysis is proposed. A function of evidence distance is first leveraged to measure the degree of conflict among the pieces of evidence; the support degree thus obtained represents the reliability of the evidence. Next, the uncertainty of each piece of evidence is measured by means of the belief entropy. Based on this quantitative uncertainty, fuzzy preference relations are applied to represent the relative credibility preference of the evidence. Afterwards, the support degree of each piece of evidence is adjusted using the relative credibility preference, which yields an appropriate weight for each piece of evidence. Finally, the modified weights are adopted to adjust the bodies of evidence before applying Dempster's combination rule. A numerical example and a practical application in fault diagnosis demonstrate that the proposal is reasonable and efficient for conflict management and fault diagnosis.

19.
Artif Intell Med ; 72: 56-71, 2016 09.
Article in English | MEDLINE | ID: mdl-27664508

ABSTRACT

OBJECTIVE: For efficient and sophisticated analysis of complex event patterns that appear in streams of big data from health care information systems, and to support decision-making, a triaxial hierarchical model is proposed in this paper. METHODS AND MATERIAL: Our triaxial hierarchical model is developed by focusing on hierarchies among nested event pattern queries together with an event concept hierarchy, thereby allowing us to identify the relationships among the expressions and sub-expressions of the queries extensively. We devise a cost-based heuristic by means of the triaxial hierarchical model to find an optimised query execution plan in terms of the costs of both the operators and the communications between them. According to the triaxial hierarchical model, we can also determine how to reuse the results of the common sub-expressions across multiple queries. By integrating the optimised query execution plan with the reuse schemes, a multi-query optimisation strategy is developed to accomplish efficient processing of multiple nested event pattern queries. RESULTS: We present empirical studies in which the performance of the multi-query optimisation strategy was examined under various stream input rates and workloads. Specifically, the workloads of pattern queries can be used to support monitoring of patients' conditions. Experiments with varying stream input rates correspond to changes in the number of patients that a system should manage, whereas burst input rates correspond to rushes of patients to be taken care of. The experimental results show that our proposal achieves about 4 times and 2 times the throughput of the respective related works in Workload 1, about 3 times and 2 times in Workload 2, and about 6 times the throughput of the related work in Workload 3.
CONCLUSION: The experimental results demonstrated that our proposal was able to process complex queries efficiently which can support health information systems and further decision-making.


Subjects
Data Mining , Decision Support Techniques , Algorithms , Data Collection , Health Information Systems , Humans , Statistical Models , Automated Pattern Recognition
20.
Sci Rep ; 6: 31350, 2016 08 12.
Article in English | MEDLINE | ID: mdl-27515908

ABSTRACT

With prevalent attacks on communication, sharing a secret between communicating parties is an ongoing challenge. Moreover, it is important to integrate quantum solutions with classical secret sharing schemes at low computational cost for real-world use. This paper proposes a novel hybrid threshold adaptable quantum secret sharing scheme using an m-bonacci orbital angular momentum (OAM) pump, Lagrange interpolation polynomials, and reverse Huffman-Fibonacci-tree coding. Specifically, we employ entangled states prepared from m-bonacci sequences to detect eavesdropping, and we encode m-bonacci sequences in Lagrange interpolation polynomials to generate the shares of a secret with reverse Huffman-Fibonacci-tree coding. The advantages of the proposed scheme are that it can detect eavesdropping without joint quantum operations, and that it permits secret sharing among an arbitrary number of classical participants, provided the number is no less than the threshold, with much lower bandwidth. Also, in comparison with existing quantum secret sharing schemes, it still works under dynamic changes such as the unavailability of some quantum channel, the arrival of new participants, and the departure of participants. Finally, we provide a security analysis of the new hybrid quantum secret sharing scheme and discuss its useful features for modern applications.
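
The classical Lagrange-interpolation component of such a threshold scheme is Shamir secret sharing; a minimal sketch over a prime field, with the quantum and m-bonacci parts omitted and all names illustrative:

```python
import random

PRIME = 2 ** 61 - 1  # a Mersenne prime, large enough for a demo field

def make_shares(secret, t, n, prime=PRIME):
    """Classical Shamir (t, n) threshold sharing: hide the secret as the
    constant term of a random degree t-1 polynomial over GF(prime) and hand
    out n point evaluations (x, f(x)) as shares."""
    coeffs = [secret] + [random.randrange(prime) for _ in range(t - 1)]
    def f(x):
        acc = 0
        for c in reversed(coeffs):  # Horner evaluation mod prime
            acc = (acc * x + c) % prime
        return acc
    return [(x, f(x)) for x in range(1, n + 1)]

def reconstruct(shares, prime=PRIME):
    """Lagrange interpolation at x = 0 recovers the constant term (the
    secret) from any t or more shares."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % prime
                den = den * (xi - xj) % prime
        secret = (secret + yi * num * pow(den, -1, prime)) % prime
    return secret

shares = make_shares(12345, t=3, n=5)
```

Any 3 of the 5 shares reconstruct the secret, while fewer reveal nothing; the paper layers its quantum eavesdropping detection on top of this classical core.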
